On the Convergence of Projected-Gradient Methods with Low-Rank Projections for Smooth Convex Minimization over Trace-Norm Balls and Related Problems
Authors
Abstract
Smooth convex minimization over the unit trace-norm ball is an important optimization problem in machine learning, signal processing, statistics, and other fields, and underlies many tasks in which one wishes to recover a low-rank matrix given certain measurements. While first-order methods for this problem enjoy optimal convergence rates, in the worst case they require computing a full-rank SVD on each iteration in order to compute the Euclidean projection onto the ball. These full-rank SVD computations, however, prohibit the application of such methods to large-scale problems. A simple and natural heuristic for reducing the computational cost is to approximate the Euclidean projection using only a low-rank SVD. This raises the question of if, and under what conditions, this heuristic can indeed result in provable convergence to an optimal solution. In this paper we show that any optimal solution is the center of a Euclidean ball inside of which the projected-gradient mapping admits rank at most the multiplicity of the largest singular value of the gradient vector at that point. Moreover, the radius of this ball scales with the spectral gap of that gradient vector. We show how this readily implies the local convergence (i.e., from a "warm-start" initialization) of standard projected-gradient methods, as well as of accelerated methods, using only low-rank SVD computations. We also quantify the effect of "over-parameterization," i.e., using SVD computations of higher rank, on the radius of this ball, showing that it can increase dramatically with moderately larger rank. We extend our results to the settings of smooth convex minimization with trace-norm regularization and smooth convex minimization over bounded-trace positive semidefinite matrices. Our theoretical investigation is supported by concrete empirical evidence demonstrating correct convergence with low-rank projections on the matrix completion task with real-world datasets.
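As a minimal sketch of the heuristic the abstract describes (not the paper's own implementation), the Euclidean projection onto the trace-norm ball reduces to projecting the singular values onto an l1 ball; the low-rank variant keeps only the top-r singular triplets before projecting. Function names below are illustrative, and the top-r truncation is taken from a full SVD for simplicity, where a large-scale solver would use an iterative top-r SVD instead:

```python
import numpy as np

def project_l1_ball(s, radius=1.0):
    """Euclidean projection of a nonnegative vector s (singular values)
    onto the l1 ball of the given radius."""
    if s.sum() <= radius:
        return s
    u = np.sort(s)[::-1]                 # sort descending
    cumsum = np.cumsum(u)
    # largest index rho with u[rho] - (cumsum[rho] - radius)/(rho+1) > 0
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (cumsum - radius))[0][-1]
    theta = (cumsum[rho] - radius) / (rho + 1.0)
    return np.maximum(s - theta, 0.0)

def low_rank_projection(X, r, radius=1.0):
    """Approximate Euclidean projection of X onto the trace-norm ball,
    using only the rank-r part of its SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U, s, Vt = U[:, :r], s[:r], Vt[:r, :]   # keep top-r components only
    s_proj = project_l1_ball(s, radius)
    return U @ (s_proj[:, None] * Vt)       # reassemble U diag(s_proj) Vt
```

The returned matrix has rank at most r and trace norm at most `radius`, which is exactly the structure the paper exploits in the neighborhood of an optimal solution.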
Similar References
Factor Matrix Trace Norm Minimization for Low-Rank Tensor Completion
Most existing low-n-rank minimization algorithms for tensor completion suffer from high computational cost because they involve multiple singular value decompositions (SVDs) at each iteration. To address this issue, we propose a novel factor matrix trace norm minimization method for tensor completion problems. Based on the CANDECOMP/PARAFAC (CP) decomposition, we first formulate a factor matrix ran...
Intermediate Gradient Methods for Smooth Convex Problems with Inexact Oracle
Between the robust but slow (primal or dual) gradient methods and the fast but error-sensitive fast gradient methods, our goal in this paper is to develop first-order methods for smooth convex problems with intermediate speed and intermediate sensitivity to errors. We develop a general family of first-order methods, the Intermediate Gradient Method (IGM), based on two sequences of coefficie...
Low-rank optimization with trace norm penalty
The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and the computation of duality gap numerically tractable. The search space is nonlinear but is equ...
The Evaluation of Language-Related Engagement and Task-Related Engagement with the Purpose of Investigating the Effect of Metatalk and Task Typology
Abstract: While task-based instruction is considered in the related literature to be the most effective way to learn a language, it is oversimplified on various grounds. Different variables may affect how students are engaged not only with the language but also with the task itself. The present study was conducted to investigate language- and task-related engagement on the basis of the task typolog...
Regularized gradient-projection methods for finding the minimum-norm solution of the constrained convex minimization problem
Let H be a real Hilbert space and C be a nonempty closed convex subset of H. Assume that g is a real-valued convex function and the gradient ∇g is [Formula: see text]-ism with [Formula: see text]. Let [Formula: see text], [Formula: see text]. We prove that the sequence [Formula: see text] generated by the iterative algorithm [Formula: see text], [Formula: see text] converges strongly to [Formul...
Journal
Journal title: SIAM Journal on Optimization
Year: 2021
ISSN: 1095-7189, 1052-6234
DOI: https://doi.org/10.1137/18m1233170